Mutual Information Gain based Test Suite Reduction

Authors

  • Meenu Dave
  • Rashmi Agrawal
  • Mary Jean Harrold
  • R. A. DeMillo
  • D. S. Guindi
  • K. N. King
  • W. M. McCracken
Abstract

Optimizing the test suite during test case generation can save time and cost. This paper presents an information-theory-based metric to filter out redundant test cases and reduce test suite size while maintaining requirement coverage with minimal loss of mutant coverage. The paper proposes two versions, RR and RR2: RR filters test cases for each requirement, whereas RR2 filters until the target coverage is achieved. Based on the results, the paper suggests the time and phase at which to apply the algorithms. The results show that the proposed algorithms effectively optimize the testing process, saving time and resources.
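The abstract does not spell out RR or RR2, so the following is only a minimal sketch of the general idea: greedy test-suite reduction that keeps a test case only when it adds not-yet-covered requirements, with an optional target set as an RR2-style stopping condition. All names (`reduce_suite`, the `coverage` mapping) are hypothetical, not the paper's API.

```python
def reduce_suite(coverage, target=None):
    """Greedy reduction sketch (hypothetical; not the paper's exact RR/RR2).

    coverage: dict mapping test-case id -> set of requirements it covers.
    target:   optional set of requirements to reach (RR2-style stop
              condition); defaults to every coverable requirement.
    """
    if target is None:
        target = set().union(*coverage.values())
    covered, kept = set(), []
    remaining = dict(coverage)
    while covered < target and remaining:
        # Pick the test case contributing the most new requirements --
        # a simple proxy for the "information gain" of keeping it.
        best = max(remaining, key=lambda t: len(remaining[t] - covered))
        gained = remaining.pop(best)
        if not gained - covered:
            break  # no remaining test adds new coverage
        kept.append(best)
        covered |= gained
    return kept

suite = {"t1": {"r1", "r2"}, "t2": {"r2"}, "t3": {"r3"}, "t4": {"r1", "r3"}}
print(reduce_suite(suite))  # a 2-test subset covering r1, r2, r3
```

Test cases that contribute nothing new (here `t2`) are filtered out, which is the redundancy-removal behaviour the abstract describes.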

Similar Articles

A Comparative Study on Feature Selection Methods for Drug Discovery

Feature selection is frequently used as a preprocessing step to machine learning. The removal of irrelevant and redundant information often improves the performance of learning algorithms. This paper is a comparative study of feature selection in drug discovery. The focus is on aggressive dimensionality reduction. Five methods were evaluated, including information gain, mutual information, a ch...

Information Measures for Object Recognition

We have been studying information theoretic measures, entropy and mutual information, as performance bounds on the information gain given a standard suite of sensors. Object pose is described by a single angle of rotation using a Lie group parameterization; observations are simulated using CAD models for the targets of interest and simulators such as the PRISM infrared simulator. Variability in...

Text Classification Using Small Number of Features

A feature selection method for text classification based on information gain ranking, improved by removing redundant terms using a mutual information measure and an inclusion index, is proposed. We report an experiment studying the impact of term redundancy on the performance of a text classifier. The results show that term redundancy behaves much like noise and may degrade the classifier performa...

Information Gain in Object Recognition Via Sensor Fusion

We have been studying information theoretic measures, entropy and mutual information, as performance metrics on the information gain given a standard suite of sensors. Object pose is described by a single angle of rotation using a Lie group parameterization; observations are generated using CAD models for the targets of interest and simulators. Variability in the data due to the sensor by whic...

On Classification of Bivariate Distributions Based on Mutual Information

Among all measures of independence between random variables, mutual information is the only one that is based on information theory. Mutual information takes into account all kinds of dependencies between variables, i.e., both linear and non-linear dependencies. In this paper we have classified some well-known bivariate distributions into two classes of distributions based on their mutua...
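The quantity this abstract refers to is the standard discrete mutual information, I(X;Y) = Σ p(x,y) log₂(p(x,y) / (p(x)p(y))). As a small self-contained illustration (the function name and input format are my own, not from the paper):

```python
from math import log2

def mutual_information(joint):
    """I(X;Y) in bits, for a joint distribution given as {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p  # marginal of X
        py[y] = py.get(y, 0.0) + p  # marginal of Y
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Perfectly dependent fair bits: 1 bit of shared information.
dependent = {(0, 0): 0.5, (1, 1): 0.5}
# Independent fair bits: no shared information.
independent = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(mutual_information(dependent))    # 1.0
print(mutual_information(independent))  # 0.0
```

Because the definition sums over the full joint distribution, it captures non-linear as well as linear dependence, which is the property the abstract highlights.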


Journal title:

Volume   Issue 

Pages  -

Publication date: 2017